Entropy Approximation in Lossy Source Coding Problem

Authors

  • Marek Smieja
  • Jacek Tabor
Abstract

In this paper, we investigate a lossy source coding problem in which an upper limit on the permitted distortion is defined for every dataset element. This can be seen as an alternative to rate distortion theory, where a bound on the allowed average error is specified instead. To find the entropy, which gives the statistical length of a source code compatible with a fixed distortion bound, a corresponding optimization problem has to be solved. First, we show how to simplify this general optimization by discarding coding partitions that are irrelevant to the entropy calculation. In our main result, we present a fast, easily implementable greedy algorithm that approximates the entropy to within an additive error of log2(e) ≈ 1.44 bits. The proof is based on the minimum entropy set cover problem, for which a similar bound is known.
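The flavor of such a greedy approximation can be sketched concretely. Below is a minimal Python illustration of the classic greedy minimum entropy set cover heuristic that the paper's bound builds on: repeatedly pick the candidate coding set covering the most still-unassigned points, then report the entropy of the resulting partition. The toy data and all names are hypothetical, not the authors' implementation.

```python
from math import log2

def greedy_entropy_cover(points, candidate_sets):
    """Greedy minimum-entropy-style cover (illustrative sketch).

    points: iterable of hashable items to be covered.
    candidate_sets: list of sets; each point must lie in at least one.
    Returns the entropy (bits) of the partition the greedy cover induces.
    """
    uncovered = set(points)
    part_sizes = []  # sizes of the parts carved out by the greedy choices
    while uncovered:
        # pick the candidate set covering the most still-uncovered points
        best = max(candidate_sets, key=lambda s: len(s & uncovered))
        covered_now = best & uncovered
        if not covered_now:
            raise ValueError("candidate sets do not cover all points")
        part_sizes.append(len(covered_now))
        uncovered -= covered_now
    n = sum(part_sizes)
    # greedy is known to land within log2(e) ~ 1.44 bits of the optimum
    return -sum(s / n * log2(s / n) for s in part_sizes)

# toy usage: 6 points, three overlapping candidate coding sets
print(greedy_entropy_cover(range(6), [{0, 1, 2, 3}, {3, 4, 5}, {1, 4}]))
```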

Similar Resources

Lossless Source Coding Lecture Notes & Examples

Variously referred to as source coding, noiseless coding, or entropy coding, lossless coding implies that the message may be decoded back, exactly and without error, to the original sequence of symbols. The converse of lossless coding ("lossy" coding) implies some degree of approximation of the original message. Lossless coding may augment lossy coding, e.g. VQ...
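As a concrete instance, the sketch below builds a Huffman code, a standard form of lossless entropy coding, and checks that the encoded message decodes back to the original symbols exactly. It is a generic textbook illustration, not code from the lecture notes above.

```python
import heapq
from collections import Counter

def huffman_code(message):
    """Return a prefix-free binary code {symbol: bitstring} for message."""
    freq = Counter(message)
    # heap entries: (weight, unique tiebreaker, {symbol: code-so-far})
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

msg = "abracadabra"
code = huffman_code(msg)
encoded = "".join(code[s] for s in msg)
inv = {v: k for k, v in code.items()}
decoded, buf = [], ""
for bit in encoded:             # prefix-free: the first match is the symbol
    buf += bit
    if buf in inv:
        decoded.append(inv[buf])
        buf = ""
assert "".join(decoded) == msg  # exact, error-free recovery
```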

The SpEnt Method for Lossy Source Coding

At present, the most successful methods for lossy source compression are sample-function adaptive coders. Prominent examples of these techniques are the still-image compression methods utilizing wavelet expansions and tree structures, such as the zero-tree method or the SPIHT algorithm, and variable-rate speech coders that allocate bits to parameters within a frame based upon the classification...

Remote Source Coding under Gaussian Noise: Dueling Roles of Power and Entropy-Power

Lossy source coding under the mean-squared error fidelity criterion is considered. The rate-distortion function can be expressed in closed form only for very special cases, including Gaussian sources. The classical upper and lower bounds look exactly alike, except that the upper bound contains the source power (variance) whereas the lower bound contains the source entropy-power. This pleasing duality of p...
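This duality is easy to evaluate numerically. Under MSE distortion the classical bounds read R(D) <= (1/2) log2(sigma^2 / D) and R(D) >= (1/2) log2(N(X) / D), where N(X) = 2^{2h(X)} / (2*pi*e) is the entropy power. The sketch below compares both for a unit-variance Laplacian source; the choice of source is purely an illustrative assumption.

```python
from math import log2, pi, e, sqrt

def rd_bounds(variance, diff_entropy_bits, D):
    """Shannon upper/lower bounds (bits/sample) on R(D) under MSE."""
    entropy_power = 2 ** (2 * diff_entropy_bits) / (2 * pi * e)
    upper = 0.5 * log2(variance / D)        # source power in the upper bound
    lower = 0.5 * log2(entropy_power / D)   # entropy power in the lower bound
    return lower, upper

# unit-variance Laplacian: scale b = 1/sqrt(2), h(X) = log2(2*e*b) bits
b = 1 / sqrt(2)
print(rd_bounds(1.0, log2(2 * e * b), D=0.1))
# lower < upper here; the two coincide exactly for a Gaussian source
```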

Minimum Entropy Set Cover Problem for Lossy Data Compression

The classical minimum entropy set cover problem relies on finding the most likely assignment between a set of observations and a given set of their types. The solution is described by the partition of the data space that minimizes the entropy of the distribution of types. The problem finds natural applications in machine learning, clustering, and data classification. In this paper we show...
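The objective in question is simple to compute: given an assignment of observations to types, score it by the Shannon entropy of the induced type distribution and prefer the most skewed (lowest-entropy) cover. A small illustration on made-up data:

```python
from collections import Counter
from math import log2

def assignment_entropy(assignment):
    """Entropy (bits) of the type distribution induced by an assignment,
    given as a list mapping each observation to its chosen type."""
    counts = Counter(assignment)
    n = len(assignment)
    return -sum(c / n * log2(c / n) for c in counts.values())

# two covers of the same six observations; MESC prefers the skewed one
print(assignment_entropy(["A", "A", "A", "A", "B", "B"]))  # ~0.918 bits
print(assignment_entropy(["A", "A", "B", "B", "C", "C"]))  # ~1.585 bits
```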

Universal Deep Neural Network Compression

Compression of deep neural networks (DNNs) for memory- and computation-efficient compact feature representations is a critical problem, particularly for the deployment of DNNs on resource-limited platforms. In this paper, we investigate lossy compression of DNNs by weight quantization and lossless source coding for memory-efficient inference. Whereas the previous work addressed non-universal scal...
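The quantize-then-code pipeline this snippet outlines can be sketched as follows: uniformly quantize the weights, then estimate the rate a lossless source coder could reach as the empirical entropy of the quantization indices. The step size and the numpy-based setup are illustrative assumptions, not the paper's universal scheme.

```python
import numpy as np

def quantize_and_rate(weights, step=0.02):
    """Uniformly quantize weights; estimate bits/weight as the empirical
    entropy of the quantization indices (an illustrative stand-in for the
    rate of a subsequent lossless entropy coder)."""
    indices = np.round(weights / step).astype(np.int64)
    _, counts = np.unique(indices, return_counts=True)
    p = counts / counts.sum()
    entropy_bits = float(-(p * np.log2(p)).sum())
    return indices * step, entropy_bits     # dequantized weights, rate

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=10_000)      # toy stand-in for DNN weights
w_hat, rate = quantize_and_rate(w)
print(rate, np.abs(w - w_hat).max() <= 0.01 + 1e-12)  # error <= step/2
```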


Journal:
  • Entropy

Volume 17, Issue -

Pages -

Publication date: 2015